Understanding Hidden Memories of Recurrent Neural Networks

Authors

  • Yao Ming
  • Shaozu Cao
  • Ruixiang Zhang
  • Zhen Li
  • Yuanzhe Chen
  • Yangqiu Song
  • Huamin Qu
E-mail: {ymingaa, scaoad, rzhangav, zhen, ychench, yqsong, huamin}@ust.hk

Abstract

Recurrent neural networks (RNNs) have been successfully applied to various natural language processing (NLP) tasks and achieved better results than conventional methods. However, the lack of understanding of the mechanisms behind their effectiveness limits further improvements on their architectures. In this paper, we present a visual analytics method for understanding and comparing RNN models for NLP tasks. We propose a technique to explain the function of individual hidden state units based on their expected response to input texts. We then co-cluster hidden state units and words based on the expected response and visualize co-clustering results as memory chips and word clouds to provide more structured knowledge on RNNs’ hidden states. We also propose a glyph-based sequence visualization based on aggregate information to analyze the behavior of an RNN’s hidden state at the sentence level. The usability and effectiveness of our method are demonstrated through case studies and reviews from domain experts.
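To make the expected-response idea above concrete, here is a minimal sketch in Python (NumPy plus scikit-learn). It is an illustration under assumptions, not the authors' implementation: run_rnn, corpus, and vocab are hypothetical stand-ins for a trained RNN and its tokenized data, and SpectralCoclustering is one possible co-clustering algorithm applied to the resulting word-by-unit response matrix.

import numpy as np
from sklearn.cluster import SpectralCoclustering

def expected_response(corpus, run_rnn, vocab, n_units):
    """For each word, average the change in every hidden unit when that word is read."""
    totals = np.zeros((len(vocab), n_units))     # vocab: dict mapping word -> row index
    counts = np.zeros(len(vocab))
    for sentence in corpus:                      # sentence: list of word strings
        states = run_rnn(sentence)               # hypothetical helper: (len(sentence), n_units) hidden states
        prev = np.zeros(n_units)
        for word, h in zip(sentence, states):
            totals[vocab[word]] += h - prev      # per-word update of the hidden state
            counts[vocab[word]] += 1
            prev = h
    return totals / np.maximum(counts, 1)[:, None]

# Co-cluster words (rows) and hidden units (columns) on the response magnitudes,
# pairing groups of words with the units that respond most strongly to them.
# response = expected_response(corpus, run_rnn, vocab, n_units)
# model = SpectralCoclustering(n_clusters=10, random_state=0).fit(np.abs(response) + 1e-9)
# word_clusters, unit_clusters = model.row_labels_, model.column_labels_

Each resulting co-cluster could then be rendered as one "memory chip" of hidden units next to a word cloud of its words, roughly the structured view the abstract describes.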

Similar articles

A neural network with a single recurrent unit for associative memories based on linear optimization

Recently, some continuous-time recurrent neural networks have been proposed for associative memories based on optimizing linear or quadratic programming problems. In this paper, a simple and efficient neural network with a single recurrent unit is proposed for realizing associative memories. Compared with the existing neural networks for associative memories, the main advantage of the proposed ...


Application of artificial neural networks on drought prediction in Yazd (Central Iran)

In recent decades, artificial neural networks (ANNs) have shown great ability in modeling and forecasting non-linear and non-stationary time series, and in most cases, especially in the prediction of such phenomena, have shown very good performance. This paper presents the application of artificial neural networks to predict drought at the Yazd meteorological station. In this research, different archite...


Multi-recurrent Networks for Traffic Forecasting

Recurrent neural networks solving the task of short-term traffic forecasting are presented in this report. They turned out to be very well suited to this task; they even outperformed the best results obtained with conventional statistical methods. The outcome of a comparative study shows that multiple combinations of feedback can greatly enhance the network performance. Best results were obtained...

Continuous Attractors in Recurrent Neural Networks and Phase Space Learning

Recurrent networks can be used as associative memories where the stored memories represent fixed points to which the dynamics of the network converges. These networks, however, can also present continuous attractors, such as limit cycles and chaotic attractors. The use of these attractors in recurrent networks for the construction of associative memories is argued for. Here, we provide a training algori...
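As a quick illustration of the fixed-point view of associative memory described in this entry, the following Python sketch stores ±1 patterns in a Hopfield-style recurrent network with a Hebbian rule and recalls one by iterating the dynamics until it reaches a fixed point. This is the standard textbook construction, not the training algorithm proposed in the cited paper.

import numpy as np

def store(patterns):
    """Hebbian weights for ±1 patterns; the stored patterns become fixed points."""
    P = np.array(patterns, dtype=float)          # shape: (n_patterns, n_units)
    W = P.T @ P / P.shape[1]
    np.fill_diagonal(W, 0.0)                     # no self-connections
    return W

def recall(W, probe, max_steps=100):
    """Iterate the recurrent dynamics until the state stops changing (a fixed point)."""
    s = np.array(probe, dtype=float)
    for _ in range(max_steps):
        nxt = np.sign(W @ s)
        nxt[nxt == 0] = 1.0                      # break ties toward +1
        if np.array_equal(nxt, s):
            break
        s = nxt
    return s

# Example: store two patterns and recall the first from a corrupted probe.
# W = store([[1, -1, 1, -1, 1, 1], [-1, 1, 1, 1, -1, -1]])
# print(recall(W, [1, -1, -1, -1, 1, 1]))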



Journal:
  • CoRR

Volume: abs/1710.10777

Publication date: 2017